
    Triple gauge couplings in polarised e-e+ -> W-W+ and their measurement using optimal observables

    The sensitivity of optimal integrated observables to electroweak triple gauge couplings is investigated for the process e-e+ -> W-W+ -> 4 fermions at future linear colliders. By a suitable reparameterisation of the couplings we ensure that all 28 coupling parameters have uncorrelated statistical errors and are naturally normalised for this process. Discrete symmetry properties simplify the analysis and allow checks on the stability of numerical results. We investigate the sensitivity of the normalised event distribution to the couplings and the additional constraints that can be obtained from the total rate. Particular emphasis is placed on the gain in sensitivity that can be achieved with longitudinal beam polarisation. We also point out questions that may best be settled with transversely polarised beams. In particular, we find that with purely longitudinal polarisation one linear combination of coupling parameters is hardly measurable by means of the normalised event distribution. Comment: 56 pages, 20 figures
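    The uncorrelated-error property of such a reparameterisation can be illustrated numerically. The sketch below is a toy illustration, not the paper's 28-parameter analysis: given optimal observables evaluated on simulated events, the statistical covariance of the extracted couplings is, to leading order, proportional to the inverse covariance matrix of the observables, and couplings re-expressed in its scaled eigenbasis have uncorrelated, unit-normalised errors.

```python
import numpy as np

# Toy optimal observables O_i evaluated event by event; dimensions and values
# are illustrative placeholders, not the 28 couplings of the paper.
rng = np.random.default_rng(0)
n_events, n_couplings = 100_000, 4
toy_cov = [[1.0, 0.3, 0.1, 0.0],
           [0.3, 0.8, 0.2, 0.1],
           [0.1, 0.2, 0.5, 0.2],
           [0.0, 0.1, 0.2, 0.3]]
O = rng.multivariate_normal(np.zeros(n_couplings), toy_cov, size=n_events)

# In the optimal-observables method, the covariance of the fitted couplings is,
# to leading order, the inverse of the observables' covariance divided by the
# number of events.
cov_couplings = np.linalg.inv(np.cov(O, rowvar=False)) / n_events

# Diagonalise and rescale: couplings written in this basis have uncorrelated
# statistical errors of size one (a "natural normalisation" for the data set).
eigvals, eigvecs = np.linalg.eigh(cov_couplings)

def to_uncorrelated_basis(g):
    """Map a coupling vector g to the reparameterised, uncorrelated basis."""
    return (eigvecs.T @ g) / np.sqrt(eigvals)

print("statistical errors along the new directions:", np.sqrt(eigvals))
```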

    Refractive X-ray beam shaping

    This work introduces new refractive illumination optics for the hard X-ray region and describes a method for overcoming fabrication limitations of X-ray depth lithography. In particular, the problem of high aspect ratios in X-ray prism lenses is addressed. The refractive X-ray optics are developed for the photon energy range 8-100 keV. In the following, we report the development of a new type of focusing optic with a large aperture, an illumination condenser for full-field X-ray microscopy, and a so-called beam-shaping optic that overcomes the limited field of view at 3rd- and 4th-generation synchrotron sources. To reduce the absorption of X-rays in the material of the optical systems, the approach of X-ray prism lenses was pursued. Here, the optics consist of rows of micro prisms with an edge length of about 20 µm, which deflect the incident rays. This improves the ratio of the refractive power of the optics to the volume of the absorbing lens material. The mechanical stability of the fragile, very tall micro prisms is achieved by exposing thin, stabilizing support planes. In order to achieve focal sizes smaller than the prism edge lengths, double-parabolic biconcave micro-lenses were added to the prism rows. A similar arrangement with biconvex micro-lenses was used to achieve beam expansion while simultaneously homogenizing the illumination of the image field of a full-field X-ray microscope. Beam-shaping optics consisting of kinoform Fresnel lens elements were developed for vertical beam expansion at high-brilliance synchrotron sources. In all cases, the theory is based on geometrical optics and ray-tracing simulations. The optics were produced via deep X-ray lithography at the LIGA I and II beamlines of the synchrotron radiation source at KIT. The lens material is the negative resist mr-X, an epoxy resin-based polymer of the SU-8 type. The lenses were characterized at PETRA III (DESY, Hamburg) and at the ESRF in Grenoble.
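    A back-of-the-envelope ray-optics sketch helps explain why rows of many micro prisms are needed: for hard X-rays the refractive index is n = 1 - δ with δ of order 10⁻⁶, so each prism facet deflects a ray by only a few microradians, and a useful focal length requires a ray to traverse many facets. The numbers below (δ, incidence angle, focal length, ray height) are assumed for illustration and are not taken from the thesis.

```python
import numpy as np

# Minimal geometrical-optics sketch of X-ray refraction at a single prism facet.
# delta is assumed/illustrative: for SU-8-like polymers it is of order 1e-6 in
# the hard X-ray range, with the exact value depending on energy and density.
delta = 3e-6                  # refractive-index decrement, n = 1 - delta (assumed)
theta_i = np.deg2rad(60.0)    # angle of incidence on the facet, from the normal (assumed)

# Exact Snell's law at one facet: sin(theta_t) = sin(theta_i) / n
n = 1.0 - delta
theta_t = np.arcsin(np.sin(theta_i) / n)
deflection_exact = theta_t - theta_i           # radians, tiny for X-rays
deflection_approx = delta * np.tan(theta_i)    # standard small-delta estimate

print(f"deflection per facet: {deflection_exact:.3e} rad "
      f"(approx {deflection_approx:.3e} rad)")

# A ray passing the lens at height y must be bent by y / f to reach the focus,
# which fixes how many facets it has to traverse (illustrative numbers).
f = 1.0          # desired focal length in metres (assumed)
y = 500e-6       # ray height above the optical axis in metres (assumed)
facets_needed = (y / f) / deflection_exact
print(f"facets needed at y = {y * 1e6:.0f} µm: {facets_needed:.0f}")
```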

    Subtraction-noise projection in gravitational-wave detector networks

    In this paper, we present a successful implementation of a subtraction-noise projection method into a simple, simulated data-analysis pipeline of a gravitational-wave search. We investigate the problem of revealing a weak stochastic background signal that is covered by a strong foreground of compact-binary coalescences. The foreground, which is estimated by matched filters, has to be subtracted from the data. Even an optimal analysis of the foreground signals will leave subtraction noise due to estimation errors in the template parameters, which may corrupt the measurement of the background signal. This subtraction noise can be removed by a noise projection. We apply our analysis pipeline to the proposed future-generation space-borne Big Bang Observer (BBO) mission, which is to search for a stochastic background of primordial gravitational waves in the frequency range $\sim 0.1$-$1$ Hz covered by a foreground of black-hole and neutron-star binaries. Our analysis is based on a simulation code which provides a dynamical model of a time-delay interferometer (TDI) network. It generates the data as time series and incorporates the analysis pipeline together with the noise projection. Our results confirm previous ad hoc predictions that BBO will be sensitive to backgrounds with fractional energy densities below $\Omega = 10^{-16}$. Comment: 54 pages, 15 figures
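    The noise projection itself is linear algebra: to first order, the residual left after subtracting best-fit templates lies in the span of the template's derivatives with respect to its parameters, so projecting the data onto the orthogonal complement of that span removes it. Below is a minimal sketch with an assumed single-template toy model rather than the paper's TDI network simulation.

```python
import numpy as np

# Toy sketch of a subtraction-noise projection (not the BBO pipeline itself).
rng = np.random.default_rng(1)
n_samples = 4096
t = np.linspace(0.0, 1.0, n_samples)

# Assumed toy template h(t; A, f, phi) = A * sin(2*pi*f*t + phi) and its
# parameter derivatives, evaluated at the best-fit point.
A, f, phi = 1.0, 50.0, 0.3
dh_dA   = np.sin(2 * np.pi * f * t + phi)
dh_df   = A * 2 * np.pi * t * np.cos(2 * np.pi * f * t + phi)
dh_dphi = A * np.cos(2 * np.pi * f * t + phi)
M = np.column_stack([dh_dA, dh_df, dh_dphi])   # columns span the error space

# First-order subtraction noise: small parameter-estimation errors times the
# derivatives, plus instrument noise (both illustrative).
param_errors = rng.normal(scale=[1e-2, 1e-3, 1e-2])
residual = M @ param_errors + 1e-3 * rng.normal(size=n_samples)

# Projector onto the orthogonal complement of span(M): P = I - M (M^T M)^-1 M^T,
# applied without forming the full n x n matrix.
coeffs = np.linalg.solve(M.T @ M, M.T @ residual)
projected = residual - M @ coeffs

print("rms before projection:", np.std(residual))
print("rms after projection: ", np.std(projected))
```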

    Treatment of chyloperitoneum after extended lymphatic dissection during duodenopancreatectomy

    Summary: Background. Chyloperitoneum is a rare postoperative complication that might be caused by an interruption of chylous ducts in the mesenteric root or the cisterna chyli. Two cases of chyloperitoneum after duodenopancreatectomy are reported in the literature. Methods. We report here a third case, in which chyloperitoneum developed 2 weeks postoperatively when the patient resumed his normal diet. Results. The patient was treated conservatively with paracenteses, and the chyloperitoneum subsided thereafter. Conclusions. Chyloperitoneum after extended duodenopancreatectomy might be treated conservatively.

    Composite used for thermal spray instrumentation and method for making the same

    A superalloy article which comprises a substrate comprised of a superalloy; a bond coat comprised of MCrAlY, wherein M is a metal selected from the group consisting of cobalt, nickel and mixtures thereof, applied onto at least a portion of the substrate; and a ceramic top coat applied over at least a portion of the bond coat. The bond coat is exposed to a temperature within the range of about 1600-1800 °F subsequent to its application onto the substrate.

    The structure group for quasi-linear equations via universal enveloping algebras

    We consider the approach of replacing trees by (fewer) multi-indices as an index set of the abstract model space $\mathsf{T}$ to tackle quasi-linear singular SPDEs. We show that this approach is consistent with the postulates of regularity structures when it comes to the structure group $\mathsf{G}$. In particular, $\mathsf{G}\subset\mathrm{Aut}(\mathsf{T})$ arises from a Hopf algebra $\mathsf{T}^+$ and a comodule $\Delta\colon\mathsf{T}\rightarrow\mathsf{T}^+\otimes\mathsf{T}$. This approach allows us to interpret $\mathsf{G}^*\subset\mathrm{Aut}(\mathsf{T}^*)$ as a Lie group arising from a Lie algebra $\mathsf{L}\subset\mathrm{End}(\mathsf{T}^*)$ consisting of derivations; these derivations in turn are the infinitesimal generators arising from actions on the space of pairs (nonlinearities, functions of space-time). The Hopf algebra $\mathsf{T}^+$ arises from a coordinate representation of the universal enveloping algebra $\mathrm{U}(\mathsf{L})$ of the Lie algebra $\mathsf{L}$. The coordinates arise from an underlying pre-Lie algebra structure of $\mathsf{L}$. Strong finiteness properties, which are enforced by gradedness and the restrictive definition of $\mathsf{T}$, allow us to define these structures in our infinite-dimensional setting. Comment: 50 pages
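    For orientation, the following is a hedged sketch (conventions may differ in detail from the paper's) of how a group of automorphisms of $\mathsf{T}$ arises from characters of the Hopf algebra $\mathsf{T}^+$ acting through the comodule $\Delta$:

```latex
% Hedged orientation sketch; the precise conventions are those of the paper.
% Characters of T^+ act on the model space T via the comodule Delta.
\[
  \Gamma_g \;:=\; (g \otimes \mathrm{id})\,\Delta \;:\; \mathsf{T}\to\mathsf{T},
  \qquad g \in \mathrm{Alg}(\mathsf{T}^+,\mathbb{R}).
\]
% Coassociativity of Delta with respect to the coproduct Delta^+ of T^+ makes
% these maps composable, so they form a group G inside Aut(T):
\[
  (\Delta^{+}\otimes\mathrm{id})\,\Delta \;=\; (\mathrm{id}\otimes\Delta)\,\Delta
  \quad\Longrightarrow\quad
  \Gamma_g\,\Gamma_h \;=\; \Gamma_{(h\otimes g)\,\Delta^{+}}.
\]
```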

    Hedge Fund Characteristics and Performance Persistence


    Creutzfeldt-Jakob disease and homocysteine levels in plasma and cerebrospinal fluid

    Background: There is evidence that homocysteine contributes to various neurodegenerative disorders. Objective: To assess homocysteine levels in both cerebrospinal fluid (CSF) and plasma of patients with Creutzfeldt-Jakob disease (CJD). Methods: Study design: case-control study. Total homocysteine was quantified in CSF and plasma samples of CJD patients (n = 13) and healthy controls (n = 13). Results: Mean values in healthy controls: 0.15 ± 0.07 µmol/l (CSF) and 9.10 ± 2.99 µmol/l (plasma); mean values in CJD patients: 0.13 ± 0.03 µmol/l (CSF) and 9.22 ± 1.81 µmol/l (plasma). No significant differences between CJD patients and controls were observed (Mann-Whitney U, p > 0.05). Conclusions: The results indicate that endogenous homocysteine levels in the CSF and plasma of CJD patients are no higher than in normal healthy controls. These findings provide no evidence for an additional role of homocysteine in the pathogenetic mechanisms underlying CJD neurodegeneration. Copyright (C) 2005 S. Karger AG, Basel
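    The group comparison reported above is a standard non-parametric two-sample test. A minimal sketch of how such a comparison is computed with SciPy follows; the arrays are illustrative placeholders, not the study's measurements.

```python
from scipy.stats import mannwhitneyu

# Illustrative placeholder values only (NOT the study data): CSF homocysteine
# concentrations in µmol/l for two groups of n = 13, as in a case-control design.
csf_controls = [0.10, 0.12, 0.22, 0.15, 0.09, 0.18, 0.14,
                0.11, 0.20, 0.16, 0.13, 0.17, 0.19]
csf_patients = [0.12, 0.11, 0.15, 0.13, 0.10, 0.14, 0.16,
                0.12, 0.13, 0.11, 0.15, 0.14, 0.12]

# Two-sided Mann-Whitney U test; p > 0.05 would indicate no significant
# difference between the groups, as reported in the abstract.
stat, p_value = mannwhitneyu(csf_patients, csf_controls, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```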

    Scalability of Distributed Version Control Systems

    Source at https://ojs.bibsys.no/index.php/NIK/article/view/434. Distributed version control systems are popular for storing source code, but they are notoriously ill-suited for storing large binary files. We report on the results from a set of experiments designed to characterize the behavior of some widely used distributed version control systems with respect to scaling. The experiments measured commit times and repository sizes when storing single files of increasing size, and when storing increasing numbers of single-kilobyte files. The goal is to build a distributed storage system with characteristics similar to version control but for much larger data sets. An early prototype of such a system, Distributed Media Versioning (DMV), is briefly described and compared with Git, Mercurial, and the Git-based backup tool Bup. We find that processing large files without splitting them into smaller parts limits the maximum file size to what can fit in RAM. Storing millions of small files results in inefficient use of disk space. And storing files under hash-based file and directory names results in high-latency write operations, because the system must switch between directories rather than perform a sequential write. The next-phase strategy for DMV will be to break files into chunks by content for de-duplication, and then to re-aggregate the chunks into append-only log files for low-latency writes and efficient use of disk space.
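    That chunk-then-repack strategy can be sketched in a few lines: content-defined chunking makes identical regions of two file versions map to identical chunks, hash-based indexing stores each distinct chunk once, and appending chunks to a single log keeps writes sequential. The boundary rule, window size, and chunk size below are assumed for illustration and are not DMV's actual implementation.

```python
import hashlib
import os

AVG_BITS = 11     # boundary when the low bits of the window hash are zero (~2 KiB chunks, assumed)
WINDOW = 32       # sliding-window length in bytes (assumed)

def chunk_boundaries(data: bytes):
    """Yield chunk end offsets chosen by content, so equal regions chunk equally."""
    last = 0
    for i in range(WINDOW, len(data) + 1):
        window_hash = hashlib.sha256(data[i - WINDOW:i]).digest()
        if int.from_bytes(window_hash[:2], "big") & ((1 << AVG_BITS) - 1) == 0:
            yield i
            last = i
    if last < len(data):
        yield len(data)

def pack(data: bytes, index: dict, log: bytearray):
    """Split data into chunks, de-duplicate by SHA-256, append new chunks to one log."""
    manifest, prev = [], 0
    for end in chunk_boundaries(data):
        chunk = data[prev:end]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:           # each distinct chunk is stored once
            index[digest] = (len(log), len(chunk))
            log.extend(chunk)             # sequential append: low-latency write
        manifest.append(digest)
        prev = end
    return manifest                       # a file becomes a list of chunk ids

# Two versions of a file that differ only near the end share their leading
# chunks, so the second pack() call adds little new data to the log.
index, log = {}, bytearray()
base = os.urandom(200_000)
pack(base + b" version 1", index, log)
pack(base + b" version 2", index, log)
print(len(index), "unique chunks,", len(log), "bytes in the append-only log")
```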
